166 research outputs found

    On weighted structured total least squares

    No full text
    In this contribution we extend the result of (Markovsky et al., SIAM J. Matrix Anal. Appl., 2005) to the case of a weighted cost function. It is shown that the computational complexity of the proposed algorithm remains linear in the sample size when the weight matrix is banded with a bandwidth that is independent of the sample size.
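    For orientation, a hedged sketch of the problem family being generalized (the symbols below are illustrative, not necessarily the paper's notation): structured total least squares can be stated as structured low-rank approximation of a parameter vector p, and the weighted variant replaces the 2-norm misfit by a W-weighted one,

        \min_{\hat p} \; (p - \hat p)^\top W \, (p - \hat p)
        \quad \text{subject to} \quad \operatorname{rank}\, \mathcal{S}(\hat p) \le r,

    where \mathcal{S} is the affine structure (e.g., Hankel) and W is the weight matrix. A banded W, with bandwidth independent of the sample size, is what allows the solution cost to stay linear in the number of samples.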

    Linear dynamic filtering with noisy input and output

    No full text
    Estimation problems for linear time-invariant systems with noisy input and output are considered. The smoothing problem is posed as a least-norm problem, and an efficient algorithm using a Riccati-type recursion is derived. The equivalence between the optimal filter and an appropriately modified Kalman filter is established, and the optimal estimate of the input signal is derived from the optimal state estimate. The result shows that the noisy input/output filtering problem is not fundamentally different from the classical Kalman filtering problem.
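    A hedged sketch of the kind of least-norm smoothing problem behind such errors-in-variables filters (the notation is illustrative and not necessarily the paper's): given noisy records u_d, y_d of the true input and output, one seeks the smallest corrections consistent with the model,

        \min_{\hat x,\, \hat u,\, \hat y} \; \sum_t \| u_d(t) - \hat u(t) \|^2 + \| y_d(t) - \hat y(t) \|^2
        \quad \text{subject to} \quad \hat x(t+1) = A \hat x(t) + B \hat u(t), \quad \hat y(t) = C \hat x(t) + D \hat u(t).

    Solving this by a forward/backward Riccati-type recursion is what links it to a suitably modified Kalman filter.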

    Robust Structured Low-Rank Approximation on the Grassmannian

    Full text link
    Over the past years, Robust PCA has been established as a standard tool for reliable low-rank approximation of matrices in the presence of outliers. Recently, the Robust PCA approach via nuclear norm minimization has been extended to matrices with linear structures, which appear in applications such as system identification and data series analysis. At the same time, it has been shown how to control the rank of a structured approximation via matrix factorization approaches. The drawbacks of these methods lie either in the lack of robustness against outliers or in their static nature of repeated batch processing. We present a Robust Structured Low-Rank Approximation method on the Grassmannian that, on the one hand, allows for fast re-initialization in an online setting due to subspace identification on manifolds and, on the other hand, is robust against outliers due to a smooth approximation of the \ell_p-norm cost function. The method is evaluated in online time series forecasting tasks on simulated and real-world data.
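    A minimal sketch of the kind of smoothed \ell_p cost such robust subspace methods rely on; the exponent p and smoothing constant mu below are illustrative assumptions, not the paper's exact choices:

        import numpy as np

        def smoothed_lp(residual, p=0.5, mu=1e-3):
            """Smooth surrogate of the l_p cost sum_i |r_i|^p.

            Adding mu inside the power keeps the cost differentiable at zero,
            so it can be minimized by gradient-type methods on the Grassmannian.
            """
            return np.sum((residual**2 + mu) ** (p / 2))

        def smoothed_lp_grad(residual, p=0.5, mu=1e-3):
            # Elementwise gradient of the surrogate with respect to the residual.
            return p * residual * (residual**2 + mu) ** (p / 2 - 1)

    Compared with a squared-error cost, large residuals (outliers) contribute far less to the gradient, which is the source of the robustness.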

    Algorithms and literate programs for weighted low-rank approximation with missing data

    No full text
    Identification of linear models from data with missing values is posed as a weighted low-rank approximation problem in which the weights corresponding to the missing values are set to zero. Alternating projections and variable projection methods for solving the resulting problem are outlined and implemented in a literate programming style, using Matlab/Octave's scripting language. The methods are evaluated on synthetic data and on real data from the MovieLens data sets.
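    As a rough illustration (not the paper's literate Matlab/Octave code), an alternating least-squares pass for a rank-r approximation D ~ P L that simply ignores the missing entries (zero weights) might look like this:

        import numpy as np

        def als_missing(D, M, r, iters=100):
            """Low-rank approximation D ~ P @ L fitted on observed entries only.

            D : m-by-n data matrix (missing entries may hold any value)
            M : m-by-n mask, 1 where D is observed, 0 where it is missing
            r : target rank
            """
            m, n = D.shape
            P = np.random.randn(m, r)
            L = np.random.randn(r, n)
            for _ in range(iters):
                # Update each column of L from the observed rows of that column.
                for j in range(n):
                    rows = M[:, j] > 0
                    if rows.any():
                        L[:, j] = np.linalg.lstsq(P[rows], D[rows, j], rcond=None)[0]
                # Update each row of P from the observed columns of that row.
                for i in range(m):
                    cols = M[i, :] > 0
                    if cols.any():
                        P[i, :] = np.linalg.lstsq(L[:, cols].T, D[i, cols], rcond=None)[0]
            return P, L

    Each half-step is a set of small least-squares problems, so the cost per iteration is low; convergence is to a local optimum only, as is usual for alternating schemes.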

    Closed-loop data-driven simulation

    Full text link

    A Recursive Restricted Total Least-squares Algorithm

    Get PDF
    We show that the generalized total least squares (GTLS) problem with a singular noise covariance matrix is equivalent to the restricted total least squares (RTLS) problem and propose a recursive method for its numerical solution. The method is based on generalized inverse iteration. The estimation error covariance matrix and the estimated augmented correction are also characterized and computed recursively. The algorithm is computationally cheap and is suitable for online implementation. Simulation results in least squares (LS), data least squares (DLS), total least squares (TLS), and RTLS noise scenarios show fast convergence of the parameter estimates to their optimal values, obtained by the corresponding batch algorithms.
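    For background only (this is the plain batch TLS case, not the paper's recursive RTLS algorithm), inverse iteration recovers the TLS solution from the right singular vector of [A b] associated with the smallest singular value; a hedged sketch:

        import numpy as np

        def tls_by_inverse_iteration(A, b, iters=50):
            """Batch TLS estimate of x in A x ~ b via inverse iteration.

            The TLS solution is read off the right singular vector of [A b]
            with the smallest singular value; inverse iteration on [A b]^T [A b]
            converges to that vector when the smallest eigenvalue is simple.
            """
            C = np.column_stack([A, b])
            G = C.T @ C
            v = np.random.randn(G.shape[0])
            for _ in range(iters):
                v = np.linalg.solve(G, v)   # amplify the smallest-eigenvalue direction
                v /= np.linalg.norm(v)
            # The last component scales the solution: x = -v[:-1] / v[-1].
            return -v[:-1] / v[-1]

    A recursive variant updates such an iteration, together with the error covariance, as new rows of data arrive, which is what makes an online implementation cheap.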